SRoUDA: Meta Self-Training for Robust Unsupervised Domain Adaptation

Authors

Abstract

As acquiring manual labels on data can be costly, unsupervised domain adaptation (UDA), which transfers knowledge learned from a rich-label dataset to an unlabeled target dataset, is gaining increasing popularity. While extensive studies have been devoted to improving model accuracy on the target domain, the important issue of model robustness has been neglected. To make things worse, conventional adversarial training (AT) methods for improving robustness are inapplicable under the UDA scenario, since they train models on adversarial examples generated by a supervised loss function. In this paper, we present a new meta self-training pipeline, named SRoUDA, for improving the adversarial robustness of UDA models. Based on the self-training paradigm, SRoUDA starts with pre-training a source model by applying a UDA baseline on labeled source data and unlabeled target data with a developed random masked augmentation (RMA), and then alternates between adversarial training of the target model on pseudo-labeled target data and a meta step that fine-tunes the source model. While self-training allows the direct incorporation of AT into UDA, the meta step further helps mitigate error propagation from noisy pseudo labels. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of SRoUDA, where it achieves significant robustness improvement without harming clean accuracy.
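The abstract describes an alternating loop: pseudo-label the target data with the source model, adversarially train the target model on those pseudo labels, and refine the source model in a meta step. The sketch below illustrates that loop structure in PyTorch on toy data; the feature-masking function standing in for RMA, the PGD attack settings, and the plain source-model update standing in for the paper's meta-objective are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=5):
    # Basic PGD attack with illustrative hyperparameters.
    x_adv = x.clone().detach() + torch.empty_like(x).uniform_(-eps, eps)
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = torch.max(torch.min(x_adv.detach() + alpha * grad.sign(), x + eps), x - eps)
    return x_adv.detach()

def random_masked_augmentation(x, mask_ratio=0.3):
    # Randomly zeros a fraction of input features; a simplified stand-in for the paper's RMA.
    return x * (torch.rand_like(x) > mask_ratio).float()

torch.manual_seed(0)
dim, n_cls = 32, 4
source_model = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, n_cls))
target_model = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, n_cls))
opt_src = torch.optim.SGD(source_model.parameters(), lr=0.01)
opt_tgt = torch.optim.SGD(target_model.parameters(), lr=0.01)

# Toy stand-ins for labeled source data and unlabeled target data.
xs, ys = torch.randn(128, dim), torch.randint(0, n_cls, (128,))
xt = torch.randn(128, dim)

# Step 1: pre-train the source model on labeled source data with masked augmentation.
for _ in range(50):
    opt_src.zero_grad()
    F.cross_entropy(source_model(random_masked_augmentation(xs)), ys).backward()
    opt_src.step()

# Step 2: alternate between (a) adversarial training of the target model on
# pseudo-labeled target data and (b) refreshing the source model; the paper's
# meta-objective is replaced here by a plain supervised update for brevity.
for _ in range(20):
    with torch.no_grad():
        pseudo = source_model(xt).argmax(dim=1)    # (a) pseudo-label target data
    x_adv = pgd_attack(target_model, xt, pseudo)   #     craft adversarial examples
    opt_tgt.zero_grad()
    F.cross_entropy(target_model(x_adv), pseudo).backward()
    opt_tgt.step()

    opt_src.zero_grad()                            # (b) simplified source-model refresh
    F.cross_entropy(source_model(xs), ys).backward()
    opt_src.step()

print("target model trained on adversarial, pseudo-labeled target data")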


Similar Articles

Asymmetric Tri-training for Unsupervised Domain Adaptation

Deep-layered models trained on a large number of labeled samples boost the accuracy of many tasks. It is important to apply such models to different domains because collecting many labeled samples in various domains is expensive. In unsupervised domain adaptation, one needs to train a classifier that works well on a target domain when provided with labeled source samples and unlabeled target sa...


Unsupervised Arabic Dialect Adaptation with Self-Training

Useful training data for automatic speech recognition systems of colloquial speech is usually limited to expensive in-domain transcription. Broadcast news is an appealing source of easily available data to bootstrap into a new dialect. However, some languages, like Arabic, have deep linguistic differences resulting in poor cross domain performance. If no in-domain transcripts are available, but...


An unsupervised deep domain adaptation approach for robust speech recognition

This paper addresses the robust speech recognition problem as a domain adaptation task. Specifically, we introduce an unsupervised deep domain adaptation (DDA) approach to acoustic modeling in order to eliminate the training–testing mismatch that is common in real-world use of speech recognition. Under a multi-task learning framework, the approach jointly learns two discriminative classifiers u...


Robust Unsupervised Domain Adaptation for Neural Networks via Moment Alignment

A novel approach for unsupervised domain adaptation for neural networks is proposed that relies on a metric-based regularization of the learning process. The metric-based regularization aims at domain-invariant latent feature representations by means of maximizing the similarity between domain-specific activation distributions. The proposed metric results from modifying an integral probability me...


Unsupervised Linguistically-Driven Reliable Dependency Parses Detection and Self-Training for Adaptation to the Biomedical Domain

In this paper, a new self-training method for domain adaptation is illustrated, where the selection of reliable parses is carried out by an unsupervised linguistically-driven algorithm, ULISSE. The method has been tested on biomedical texts with results showing a significant improvement with respect to considered baselines, which demonstrates its ability to capture both reliability of parses a...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i3.25498